In coding theory, generalized minimum-distance (GMD) decoding provides an efficient algorithm for decoding concatenated codes, based on using an errors-and-erasures decoder for the outer code.

A naive decoding algorithm for concatenated codes cannot be optimal, because it discards the information that maximum likelihood decoding (MLD) of the inner code provides: every decoded inner symbol is passed to the outer decoder with equal weight, regardless of how close the corresponding inner received word was to the decoded inner codeword. Intuitively, the outer decoder should place higher confidence in symbols whose inner encodings are close to the received word. In 1966 David Forney devised a better algorithm, called generalized minimum distance (GMD) decoding, which makes better use of this information: it measures the confidence of each decoded inner symbol and erases those symbols whose confidence falls below a chosen threshold. GMD decoding was one of the first examples of soft-decision decoding. We will present three versions of the GMD decoding algorithm: the first two are randomized algorithms, while the last one is deterministic.

==Setup==
# Hamming distance: Given two vectors $u, v \in \Sigma^n$, the Hamming distance between $u$ and $v$, denoted by $\Delta(u, v)$, is defined to be the number of positions in which $u$ and $v$ differ.
# Minimum distance: Let $C \subseteq \Sigma^n$ be a code. The minimum distance of code $C$ is defined to be $d = \min \Delta(c_1, c_2)$, where $c_1 \neq c_2$ and $c_1, c_2 \in C$.
# Code concatenation: Given $m = (m_1, \ldots, m_K) \in [Q]^K$, consider two codes which we call the outer code $C_\text{out} : [Q]^K \to [Q]^N$ and the inner code $C_\text{in} : [q]^k \to [q]^n$ with $Q = q^k$, whose distances are $D$ and $d$ respectively. A concatenated code can be achieved by $(C_\text{out} \circ C_\text{in})(m) = (C_\text{in}(C_\text{out}(m)_1), \ldots, C_\text{in}(C_\text{out}(m)_N))$, where $C_\text{out}(m) = (C_\text{out}(m)_1, \ldots, C_\text{out}(m)_N)$. Finally, we will take $C_\text{out}$ to be a Reed–Solomon (RS) code, which has an errors-and-erasures decoder, and $k = O(\log N)$, which in turn implies that MLD on the inner code will run in $\text{poly}(N)$ time.
# Maximum likelihood decoding (MLD): MLD is a decoding method for error-correcting codes which outputs the codeword closest to the received word in Hamming distance. The MLD function $D_{MLD} : \Sigma^n \to C$ is defined as follows: for every $y \in \Sigma^n$, $D_{MLD}(y) = \arg\min_{c \in C} \Delta(c, y)$.
# Probability density function: A probability distribution $\Pr$ on a sample space $S$ is a mapping from events of $S$ to real numbers such that $\Pr[A] \geq 0$ for any event $A$, $\Pr[S] = 1$, and $\Pr[A \cup B] = \Pr[A] + \Pr[B]$ for any two mutually exclusive events $A$ and $B$.
# Expected value: The expected value of a discrete random variable $X$ is $\mathbb{E}[X] = \sum_x x \Pr[X = x]$.
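The Hamming distance and MLD definitions above can be sketched directly. The following is a minimal illustration, not part of the original article; the toy repetition code and the function names are chosen only for demonstration:

```python
def hamming_distance(u, v):
    """Number of positions in which u and v differ."""
    assert len(u) == len(v)
    return sum(a != b for a, b in zip(u, v))

def mld(code, y):
    """Maximum likelihood decoding: return the codeword in `code`
    closest to the received word y in Hamming distance."""
    return min(code, key=lambda c: hamming_distance(c, y))

# Toy binary repetition code of length 3: C = {000, 111}, minimum distance 3.
code = [(0, 0, 0), (1, 1, 1)]
```

For example, `mld(code, (1, 0, 1))` returns `(1, 1, 1)`, since the received word is at distance 1 from `111` but distance 2 from `000`. Note that MLD as written here enumerates all codewords, which is exponential in general; the setup's assumption $k = O(\log N)$ is precisely what keeps this brute force polynomial for the inner code.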
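The code-concatenation step above can also be illustrated concretely. Below is a hedged sketch with hypothetical toy codes (the names `inner_encode`, `outer_encode`, and `concatenated_encode` are illustrative, not from the article): the outer code is threefold repetition over the alphabet $[4]$ (distance $D = 3$), and the inner code maps each symbol of $[4]$ to its two binary digits plus a parity bit (a $[3, 2, 2]$ code, distance $d = 2$):

```python
def inner_encode(s):
    """Hypothetical inner code C_in : [4] -> {0,1}^3: the two binary
    digits of s followed by their parity (distance d = 2)."""
    b1, b0 = (s >> 1) & 1, s & 1
    return (b1, b0, b1 ^ b0)

def outer_encode(m):
    """Hypothetical outer code: threefold repetition over [4]
    (distance D = 3)."""
    return (m, m, m)

def concatenated_encode(m):
    """C_out ∘ C_in: encode m with the outer code, then encode each
    resulting outer symbol with the inner code."""
    return tuple(inner_encode(sym) for sym in outer_encode(m))
```

Here `concatenated_encode(2)` yields `((1, 0, 1), (1, 0, 1), (1, 0, 1))`, a codeword of length $N \cdot n = 9$ bits. Any two distinct messages differ in all $D = 3$ outer positions, and their inner encodings differ in at least $d = 2$ bits each, matching the general bound that a concatenated code has distance at least $D \cdot d$.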